Incrementality In Syntactic Processing: Computational Models And Experimental Evidence
Abstract
It is a well-known intuition that human sentence understanding works in an incremental fashion, with a seemingly constant update of the interpretation through the left-to-right processing of a string. Such intuitions are backed up by experimental evidence dating from at least as far back as Marslen-Wilson (1973), showing that under many circumstances, interpretations are indeed updated very quickly. From a parsing point of view it is interesting to consider the structure-building processes that might underlie incremental interpretation: what kinds of partial structures are built during sentence processing, and with what time-course? In this talk I will give an overview of the state of the art in experimental psycholinguistic research, paying particular attention to the time-course of structure-building. The discussion will focus on a new line of research (some as yet unpublished) in which syntactic phenomena such as binding relations (e.g., Sturt, 2003) and unbounded dependencies (e.g., Aoshima, Phillips, & Weinberg, in press) are exploited to make a very direct test of the availability of syntactic structure over time. The experimental research will be viewed from the perspective of a space of computational models, which make different predictions about the time-course of structure-building. One dimension in this space is represented by the parsing algorithm used: for example, within the framework of Generalized Left Corner Parsing (Demers, 1977), algorithms can be characterized in terms of the point at which a context-free rule is recognized, relative to the recognition points of the symbols on its right-hand side. Another relevant dimension is represented by the type of grammar formalism that is assumed. For example, with bottom-up parsing algorithms, the degree to which structure-building is delayed in right-branching structures depends heavily on whether we employ a traditional phrase-structure formalism with rigid constituency, or a categorial formalism with flexible constituency (e.g., Steedman, 2000). I will argue that the evidence is incompatible with models which predict systematic delays in the construction of syntactic structure. In particular, I will argue against both head-driven strategies (e.g., Mulders, 2002) and purely bottom-up parsing strategies, even when flexible constituency is employed. Instead, I will argue that, to capture the data in the most parsimonious way, we should turn our attention to those models in which a fully connected syntactic structure is maintained throughout the processing of a string.
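To make the notion of a recognition (or "announce") point concrete, the sketch below is a small illustrative Python example, not taken from the talk: it walks a hand-built phrase-structure tree and reports the input word after which each node would be built under a top-down, a left-corner, and a purely bottom-up schedule, in the spirit of Demers (1977). The tree, labels, and function names are assumptions made here for illustration only.

```python
# Illustrative sketch (assumed names, not from the talk): compare when each
# phrase-structure node is recognized under different announce-point
# strategies, in the spirit of Generalized Left Corner Parsing (Demers, 1977).
#
# A leaf is a word (str); an internal node is (label, [children]).

def recognition_points(tree, strategy):
    """Return a list of (label, path, k): the node `label` at tree address
    `path` is built after the k-th input word has been read (k == 0 means
    before any input word is read).

    strategy: 'top-down'    -- announce a node before any of its daughters,
              'left-corner' -- announce it once its first daughter is complete,
              'bottom-up'   -- announce it once its last daughter is complete.
    """
    points = []
    words_read = [0]  # number of input words consumed so far

    def visit(node, path):
        if isinstance(node, str):        # a word: consume it
            words_read[0] += 1
            return
        label, children = node
        if strategy == 'top-down':
            points.append((label, path, words_read[0]))
        for i, child in enumerate(children):
            visit(child, path + (i,))
            if strategy == 'left-corner' and i == 0:
                points.append((label, path, words_read[0]))
        if strategy == 'bottom-up':
            points.append((label, path, words_read[0]))

    visit(tree, ())
    return points


if __name__ == '__main__':
    # A right-branching structure for "John thinks Mary left".
    tree = ('S', [('NP', ['John']),
                  ('VP', [('V', ['thinks']),
                          ('S', [('NP', ['Mary']),
                                 ('VP', [('V', ['left'])])])])])

    for strategy in ('top-down', 'left-corner', 'bottom-up'):
        print(strategy)
        for label, path, k in recognition_points(tree, strategy):
            print(f"  {label:<2} at {path}: built after word {k}")
```

In this toy example, the bottom-up schedule builds the matrix S and VP only after the final word of the sentence, whereas the left-corner schedule connects each node as soon as its first daughter is complete; this is the kind of systematic delay in right-branching structures that the abstract alludes to.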